# Low Resource Requirements
## Bonsai
Bonsai is a small ternary-weighted language model with 500 million parameters, built on the Llama architecture and using the Mistral tokenizer, trained on fewer than 5 billion tokens.
- Tags: Large Language Model, Transformers
- Author: deepgrove (113 · 8)
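As a rough illustration of why ternary weights matter for low-resource deployment, the sketch below compares the theoretical storage cost of 500 million weights at fp16 versus the ~1.58 bits per weight that a ternary code needs in principle. This is back-of-envelope arithmetic only; it makes no claim about Bonsai's actual on-disk format.

```python
import math

PARAMS = 500_000_000  # Bonsai's stated parameter count

def weight_gigabytes(bits_per_weight: float, params: int = PARAMS) -> float:
    """Storage for the weights alone, ignoring activations and metadata."""
    return params * bits_per_weight / 8 / 1e9

fp16_gb = weight_gigabytes(16)               # plain half precision
ternary_gb = weight_gigabytes(math.log2(3))  # ~1.585 bits encodes {-1, 0, +1}

print(f"fp16 weights:    {fp16_gb:.2f} GB")    # 1.00 GB
print(f"ternary weights: {ternary_gb:.2f} GB") # ~0.10 GB
```

At this scale the weights drop from about 1 GB to roughly 100 MB, which is what makes a model like this plausible on commodity hardware.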
## Whisper Custom Small
A small speech recognition model based on the OpenAI Whisper architecture, focused on English speech-to-text tasks.
- License: Apache-2.0
- Tags: Speech Recognition, English
- Author: gyrroa (15 · 1)
## MythoMax L2 13B Q4_K_M GGUF
MythoMax L2 13B is a large language model, distributed here as the Q4_K_M quantized GGUF version, suitable for text generation tasks.
- License: Other
- Tags: Large Language Model, English
- Author: Clevyby (1,716 · 2)
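To give a sense of what Q4_K_M quantization buys at 13B scale, the sketch below compares weight storage at full fp16 against an assumed average of roughly 4.85 bits per weight, a commonly quoted figure for Q4_K_M GGUF files. The exact size of any particular file will differ.

```python
PARAMS = 13_000_000_000  # 13B parameters

def weight_gb(bits_per_weight: float) -> float:
    # Weights only; the KV cache and runtime buffers come on top.
    return PARAMS * bits_per_weight / 8 / 1e9

fp16_gb = weight_gb(16)      # full half precision
q4_k_m_gb = weight_gb(4.85)  # assumed ~4.85 bits/weight average for Q4_K_M

print(f"fp16:   {fp16_gb:.1f} GB")   # 26.0 GB
print(f"Q4_K_M: {q4_k_m_gb:.1f} GB") # ~7.9 GB
```

Roughly 8 GB instead of 26 GB is the difference between needing a datacenter GPU and fitting on a single consumer card or in system RAM.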
## TinyLlama 1.1B Step 50K 105b
TinyLlama is a 1.1B-parameter Llama model, planned to be pretrained on 3 trillion tokens, with training scheduled to complete in 90 days on 16 A100-40G GPUs.
- License: Apache-2.0
- Tags: Large Language Model, Transformers, English
- Author: TinyLlama (14.41k · 133)
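The 90-day, 16-GPU plan implies a concrete throughput target. The arithmetic below works it out from the numbers stated in the card; it is pure division and says nothing about the actual training run's measured speed.

```python
TOKENS = 3_000_000_000_000  # 3 trillion tokens planned
SECONDS = 90 * 24 * 3600    # 90 days of wall-clock time
GPUS = 16                   # A100-40G cards

cluster_tps = TOKENS / SECONDS   # tokens/s across the whole cluster
per_gpu_tps = cluster_tps / GPUS # tokens/s each GPU must sustain

print(f"cluster: {cluster_tps:,.0f} tokens/s")  # ~385,802
print(f"per GPU: {per_gpu_tps:,.0f} tokens/s")  # ~24,113
```

Sustaining ~24k tokens/s per A100 for a 1.1B model is ambitious but within the range reported for well-optimized Llama-style training stacks.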
## Rubert Tiny Squad
A Russian question-answering model fine-tuned from cointegrated/rubert-tiny2, suitable for SQuAD-format question answering tasks.
- License: MIT
- Tags: Question Answering, Transformers
- Author: Den4ikAI (32 · 0)
## Distilbert Base Uncased Finetuned Emotion Test 01
A lightweight text classification model based on DistilBERT, fine-tuned on the emotion dataset.
- License: Apache-2.0
- Tags: Text Classification, Transformers
- Author: lewtun (15 · 0)
## Distilbert Base Uncased Finetuned Ner
A lightweight model fine-tuned for named entity recognition (NER) from the DistilBERT-base-uncased model.
- License: Apache-2.0
- Tags: Sequence Labeling, Transformers
- Author: akshaychaudhary (15 · 0)